Any-Space Probabilistic Inference
Author
Abstract
We have recently introduced an any-space algorithm for exact inference in Bayesian networks, called Recursive Conditioning (RC), which allows one to trade space for time in increments of X bytes, where X is the number of bytes needed to cache a floating point number. In this paper, we present three key extensions of RC. First, we modify the algorithm so it applies to more general factorizations of probability distributions, including (but not limited to) Bayesian network factorizations. Second, we present a forgetting mechanism which reduces the space requirements of RC considerably, and we compare these requirements with those of variable elimination on a number of realistic networks, showing orders-of-magnitude improvements in certain cases. Third, we present a version of RC for computing maximum a posteriori hypotheses (MAP), which turns out to be the first MAP algorithm allowing a smooth time-space tradeoff. A key advantage of the presented MAP algorithm is that it does not have to start from scratch each time a new query is presented, but can reuse some of its computations across multiple queries, leading to significant savings in certain cases.
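To make the any-space idea concrete, here is a minimal sketch (not Darwiche's actual RC decomposition; the toy two-state Markov chain, and all names such as `marginal` and `max_entries`, are hypothetical) of a recursive probability computation whose cache size can be capped, so that each cached float trades space for recomputation time:

```python
# Toy illustration of an any-space time/space tradeoff: computing a marginal
# in a binary Markov chain by recursion, caching intermediate results only
# while a user-chosen cache budget permits. The model and API are made up
# for illustration and are not the paper's algorithm.

PRIOR = {0: 0.6, 1: 0.4}                      # P(X1)
TRANS = {(0, 0): 0.7, (0, 1): 0.3,            # P(X_{i+1} = x' | X_i = x)
         (1, 0): 0.2, (1, 1): 0.8}

def marginal(i, x, cache, max_entries, calls):
    """P(X_i = x), memoized only while the cache has room."""
    calls[0] += 1
    if i == 1:
        return PRIOR[x]
    key = (i, x)
    if key in cache:
        return cache[key]
    p = sum(marginal(i - 1, y, cache, max_entries, calls) * TRANS[(y, x)]
            for y in (0, 1))
    if len(cache) < max_entries:              # cache only if space permits
        cache[key] = p
    return p

def run(max_entries, n=12):
    calls = [0]
    p = marginal(n, 1, {}, max_entries, calls)
    return p, calls[0]

p_full, calls_full = run(max_entries=1000)    # full caching: linear work
p_none, calls_none = run(max_entries=0)       # no caching: exponential work
```

With an unbounded cache the recursion does linear work; with no cache it recomputes subproblems exponentially often, while the answer itself is unchanged. Intermediate budgets interpolate between the two, which is the tradeoff RC exposes at floating-point-number granularity.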
Similar Resources
Practical Structures for Inference in Bayesian Networks
Programmers employing inference in Bayesian networks typically rely on the inclusion of the model as well as an inference engine into their application. Sophisticated inference engines require non-trivial amounts of space and are also difficult to implement. This limits their use in some applications that would otherwise benefit from probabilistic inference. This paper presents a system that mi...
Menger probabilistic normed space is a category topological vector space
In this paper, we formalize the Menger probabilistic normed space as a category in which its objects are the Menger probabilistic normed spaces and its morphisms are fuzzy continuous operators. Then, we show that the category of probabilistic normed spaces is isomorphically a subcategory of the category of topological vector spaces. So, we can easily apply the results of topological vector spaces...
Sigma-Point Kalman Filters for Probabilistic Inference in Dynamic State-Space Models
Probabilistic inference is the problem of estimating the hidden states of a system in an optimal and consistent fashion given a set of noisy or incomplete observations. The optimal solution to this problem is given by the recursive Bayesian estimation algorithm which recursively updates the posterior density of the system state as new observations arrive online. This posterior density constitut...
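The recursive Bayesian update described above can be sketched for a discrete hidden state (a hedged toy example; the two-state model, its parameters, and the name `bayes_update` are made up for illustration, not the sigma-point filter of the cited work):

```python
# Minimal discrete-state recursive Bayesian filter: as each noisy
# observation z arrives, the posterior over the hidden state is pushed
# through the transition model (predict) and reweighted by the
# observation likelihood (correct). Model parameters are hypothetical.

TRANS = {0: {0: 0.9, 1: 0.1},                 # P(x_t | x_{t-1})
         1: {0: 0.1, 1: 0.9}}
LIK = {0: {'a': 0.8, 'b': 0.2},               # P(z_t | x_t)
       1: {'a': 0.3, 'b': 0.7}}

def bayes_update(prior, z):
    # Predict: propagate the current belief through the dynamics.
    pred = {x: sum(prior[y] * TRANS[y][x] for y in prior) for x in (0, 1)}
    # Correct: weight by the observation likelihood and renormalize.
    post = {x: pred[x] * LIK[x][z] for x in (0, 1)}
    norm = sum(post.values())
    return {x: p / norm for x, p in post.items()}

belief = {0: 0.5, 1: 0.5}                     # uniform initial belief
for z in ['a', 'a', 'b']:                     # observations arriving online
    belief = bayes_update(belief, z)
```

Each call consumes one observation and returns the new posterior, which is exactly the online, recursive structure the abstract refers to; Kalman-type filters replace the discrete sums with Gaussian (or sigma-point) approximations.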
The Hilbert Space of Probability Mass Functions and Applications on Probabilistic Inference
THE HILBERT SPACE OF PROBABILITY MASS FUNCTIONS AND APPLICATIONS ON PROBABILISTIC INFERENCE Bayramoğlu, Muhammet Fatih Ph.D., Department of Electrical and Electronics Engineering Supervisor: Assoc. Prof. Dr. Ali Özgür Yılmaz September 2011, 123 pages The Hilbert space of probability mass functions (pmf) is introduced in this thesis. A factorization method for multivariate pmfs is proposed by u...
Optimal Time-Space Tradeoff in Probabilistic Inference
Recursive Conditioning (RC) is an any-space algorithm for exact inference in Bayesian networks, which can trade space for time in increments of the size of a floating point number. This smooth tradeoff is possible by varying the algorithm's cache size. When RC is run with a constrained cache size, an important problem arises: Which specific results should be cached in order to minimize the run...
Attention as inference: selection is probabilistic; responses are all-or-none samples.
Theories of probabilistic cognition postulate that internal representations are made up of multiple simultaneously held hypotheses, each with its own probability of being correct (henceforth, "probability distributions"). However, subjects make discrete responses and report the phenomenal contents of their mind to be all-or-none states rather than graded probabilities. How can these 2 positions...